An efficient algorithm for rank-1 sparse PCA

Authors

  • Yunlong He
  • Renato Monteiro
  • Haesun Park
Abstract

Sparse principal component analysis (PCA) imposes extra constraints or penalty terms on the original PCA formulation in order to achieve sparsity. In this paper, we introduce an efficient algorithm for finding a single sparse principal component with a specified cardinality. The algorithm consists of two stages. In the first stage, it identifies an active index set with the desired cardinality, corresponding to the nonzero entries of the principal component. In the second stage, it finds the best direction with respect to the active index set, using the power iteration method. Experiments on both randomly generated data and real-world data sets show that our algorithm is very fast, especially on large and sparse data sets, while the numerical quality of the solution is comparable to that of other methods.
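
The abstract only outlines the two stages, so the following is a minimal NumPy sketch of that structure. The Stage-1 selection rule shown here (keeping the k largest-magnitude entries of the leading eigenvector of the covariance) is an illustrative assumption, not necessarily the paper's rule, and the function names and parameters are hypothetical.

```python
import numpy as np

def _power_iteration(M, n_iter=200, tol=1e-8, seed=0):
    """Leading eigenvector of a symmetric PSD matrix M via power iteration."""
    v = np.random.default_rng(seed).standard_normal(M.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        v_new = M @ v
        v_new /= np.linalg.norm(v_new)
        if np.linalg.norm(v_new - v) < tol:
            return v_new
        v = v_new
    return v

def rank1_sparse_pca(A, k):
    """Hypothetical two-stage rank-1 sparse PCA sketch (names/rules assumed).

    A : (n, p) data matrix, k : desired cardinality of the loading vector.
    Stage 1: choose an active index set of size k (here: the k largest-
    magnitude entries of the leading eigenvector of A^T A -- an
    illustrative proxy for the paper's selection stage).
    Stage 2: power iteration on the covariance restricted to the active
    set, as described in the abstract.
    """
    S = A.T @ A                           # p x p (unnormalized) covariance
    v = _power_iteration(S)               # Stage 1: dense leading direction
    active = np.argsort(np.abs(v))[-k:]   # active index set, |active| = k

    Sk = S[np.ix_(active, active)]        # covariance restricted to active set
    w = _power_iteration(Sk, seed=1)      # Stage 2: best direction on that set

    x = np.zeros(A.shape[1])
    x[active] = w                         # sparse PC: unit norm, k nonzeros
    return x
```

For example, rank1_sparse_pca(A, 10) would return a unit-norm loading vector with at most 10 nonzero entries; substituting the paper's own Stage-1 rule would only change how `active` is chosen.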

Similar articles

Generalised Scalable Robust Principal Component Analysis

Robust estimation of the low-dimensional subspace that spans the data, from a set of high-dimensional observations possibly corrupted by gross errors and outliers, is fundamental in many computer vision problems. State-of-the-art robust principal component analysis (PCA) methods adopt convex relaxations of ℓ0 quasi-norm-regularised rank minimisation problems. That is, the nuclear norm an...

Sparse Principal Component Analysis via Regularized Low Rank Matrix Approximation

Principal component analysis (PCA) is a widely used tool for data analysis and dimension reduction in applications throughout science and engineering. However, the principal components (PCs) can sometimes be difficult to interpret, because they are linear combinations of all the original variables. To facilitate interpretation, sparse PCA produces modified PCs with sparse loadings, i.e. loading...

Sparse Additive Text Models with Low Rank Background

The sparse additive model for text modeling involves sum-of-exp computations, whose cost becomes prohibitive at large scale. Moreover, the assumption of an equal background across all classes/topics may be too strong. This paper proposes a sparse additive model with low-rank background (SAM-LRB) and obtains a simple yet efficient estimation. In particular, employing a double majorization bound, w...

A Nonconvex Free Lunch for Low-Rank plus Sparse Matrix Recovery

We study the problem of low-rank plus sparse matrix recovery. We propose a generic and efficient nonconvex optimization algorithm based on projected gradient descent and a double thresholding operator, with much lower computational complexity. Compared with existing convex-relaxation-based methods, the proposed algorithm recovers the low-rank plus sparse matrices for free, without incurring any a...

Publication year: 2010